15 research outputs found

    Comparing predictive distributions in EMOS

    EMOS (ensemble model output statistics) models are widely used post-processing techniques for obtaining predictive distributions of future weather variables from forecast ensembles. A predictive distribution can be easily obtained by substituting the unknown parameters with suitable estimates in the distribution of the future variable, yielding a so-called estimative distribution. However, these distributions may perform poorly in terms of the coverage probability of the corresponding quantiles. In this work we propose the use of calibrated predictive distributions in the context of EMOS models. The proposed calibrated predictive distribution improves on estimative solutions, producing quantiles with exact coverage level. A simulation study assesses the performance of the calibrated predictive distribution in terms of coverage probabilities, logarithmic score, and continuous ranked probability score (CRPS).
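    As a rough illustration of the coverage issue described above, the following sketch (not taken from the paper; the Gaussian toy model, sample size, and all variable names are assumptions) checks by Monte Carlo how far the coverage of an estimative quantile falls below its nominal level in small samples.

```python
# Hypothetical illustration: coverage of an "estimative" predictive quantile
# obtained by plugging the sample mean and standard deviation into a normal
# model.  Not the paper's EMOS setting; a minimal Gaussian toy example.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, alpha, n_rep = 10, 0.90, 20000      # small sample, nominal level, replications
hits = 0
for _ in range(n_rep):
    sample = rng.normal(loc=0.0, scale=1.0, size=n)
    mu_hat, sigma_hat = sample.mean(), sample.std(ddof=1)
    q_est = mu_hat + sigma_hat * stats.norm.ppf(alpha)   # estimative alpha-quantile
    y_future = rng.normal(0.0, 1.0)                      # independent future value
    hits += (y_future <= q_est)

print(f"nominal level {alpha:.2f}, empirical coverage {hits / n_rep:.3f}")
# For n = 10 the empirical coverage falls noticeably below 0.90.  In this toy
# model an exactly calibrated limit replaces the normal quantile with
# mu_hat + sigma_hat * np.sqrt(1 + 1/n) * stats.t.ppf(alpha, df=n - 1).
```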

    Improved maximum likelihood estimation in heteroscedastic nonlinear regression models.

    Nonlinear heteroscedastic regression models are a widely used class of models in applied statistics, with applications especially in biology, medicine, and chemistry. Nonlinearity and variance heterogeneity can make likelihood estimation of a scalar parameter of interest rather inaccurate for small or moderate samples. In this paper we suggest a new approach to point estimation based on estimating equations obtained from higher-order pivots for the parameter of interest. In particular, we take as estimating function the modified directed likelihood. This is a higher-order pivotal quantity that can be easily computed in practice for nonlinear heteroscedastic models with normally distributed errors, using a recently developed S-PLUS library (HOA, 2000). The estimators obtained from this procedure refine the maximum likelihood estimators, improving their small-sample properties while preserving equivariance under reparameterisation. Two applications to real data sets are discussed.
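    For context, here is a minimal sketch of the baseline maximum likelihood fit for a heteroscedastic nonlinear normal regression model, i.e. the plug-in estimator that the modified directed likelihood procedure refines. The mean function, the power-of-the-mean variance model, and all names and starting values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical baseline: ML fit of a heteroscedastic nonlinear normal model
# with mean f(x) = b1*(1 - exp(-b2*x)) and variance sigma^2 * f(x)^(2*gamma).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0.5, 5.0, 30)
b1_true, b2_true, sigma_true, gamma_true = 10.0, 0.8, 0.3, 0.7
mu_true = b1_true * (1 - np.exp(-b2_true * x))
y = mu_true + rng.normal(size=x.size) * sigma_true * mu_true**gamma_true

def negloglik(theta):
    b1, b2, log_sigma, gamma = theta
    mu = b1 * (1 - np.exp(-b2 * x))
    if np.any(mu <= 0):
        return np.inf                      # keep the power-variance model valid
    sd = np.exp(log_sigma) * mu**gamma
    return 0.5 * np.sum(np.log(2 * np.pi * sd**2) + ((y - mu) / sd) ** 2)

fit = minimize(negloglik, x0=np.array([8.0, 1.0, np.log(0.5), 0.5]),
               method="Nelder-Mead", options={"maxiter": 5000})
b1_hat, b2_hat, log_sigma_hat, gamma_hat = fit.x
print("MLE:", b1_hat, b2_hat, np.exp(log_sigma_hat), gamma_hat)
```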

    Robust prediction limits based on M-estimators

    In this paper we discuss a robust solution to the problem of prediction. Following Barndorff-Nielsen and Cox (1996) and Vidoni (1998), we propose improved prediction limits based on M-estimators instead of maximum likelihood estimators. Computing these robust prediction limits requires expressions for the bias and variance of an M-estimator, and a general asymptotic approximation for the bias of an M-estimator is derived here. Moreover, by means of comparative studies in the context of affine transformation models, we show that the proposed robust procedure for prediction behaves similarly to the classical one when the model is correctly specified, while being designed to remain stable in a neighborhood of the model.
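    A minimal sketch of the kind of ingredients involved, under assumptions not taken from the paper: a Huber M-estimate of location (with scale fixed at the normalised MAD) and the naive plug-in upper prediction limit built from it. The paper's actual contribution, correcting such a limit using the asymptotic bias and variance of the M-estimator, is not reproduced here.

```python
# Hypothetical sketch: Huber M-estimate of location with MAD scale, plus a
# naive plug-in (estimative) 95% upper prediction limit for a new observation.
import numpy as np
from scipy import stats

def huber_location(y, k=1.345, tol=1e-8, max_iter=200):
    """Huber M-estimate of location via iteratively reweighted means."""
    scale = stats.median_abs_deviation(y, scale="normal")
    mu = np.median(y)
    for _ in range(max_iter):
        r = (y - mu) / scale
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu_new = np.sum(w * y) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu, scale

rng = np.random.default_rng(2)
y = rng.normal(5.0, 1.0, size=30)
y[:3] += 8.0                                           # contaminate with a few outliers
mu_hat, s_hat = huber_location(y)
upper_limit = mu_hat + s_hat * stats.norm.ppf(0.95)    # naive estimative limit
print(f"mu_hat={mu_hat:.3f}, scale={s_hat:.3f}, 95% upper limit={upper_limit:.3f}")
```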

    A note on simultaneous calibrated prediction intervals for time series

    This paper deals with simultaneous prediction for time series models. In particular, it presents a simple procedure that gives well-calibrated simultaneous prediction intervals with coverage probability close to the target nominal value. Although the exact computation of the proposed intervals is usually not feasible, they can be easily approximated by means of a suitable bootstrap simulation procedure. This new predictive solution is much simpler to compute than those already proposed in the literature, which are based on asymptotic calculations. Applications of the bootstrap-calibrated procedure to AR, MA, and ARCH models are presented.
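    The bootstrap idea can be sketched in a simplified single-horizon form (the paper treats simultaneous intervals over several horizons): estimate, by parametric bootstrap, the actual coverage of the plug-in one-step-ahead interval as a function of its nominal level, then use the nominal level whose estimated coverage matches the target. The AR(1) model, the least-squares fit, and all names below are assumptions for illustration.

```python
# Hypothetical sketch: bootstrap calibration of a one-step-ahead prediction
# interval for a zero-mean Gaussian AR(1) model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def simulate_ar1(n, phi, sigma, rng):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal(0.0, sigma)
    return y

def fit_ar1(y):
    """Least-squares estimates of (phi, sigma) for a zero-mean AR(1)."""
    phi = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
    resid = y[1:] - phi * y[:-1]
    return phi, resid.std(ddof=1)

y_obs = simulate_ar1(100, phi=0.6, sigma=1.0, rng=rng)
phi_hat, sigma_hat = fit_ar1(y_obs)
target = 0.90                                    # target coverage
levels = np.linspace(0.80, 0.995, 40)            # candidate nominal levels
z = stats.norm.ppf(0.5 + levels / 2)

# Parametric bootstrap: coverage of the plug-in interval at each nominal level.
B, coverage = 500, np.zeros_like(levels)
for _ in range(B):
    y_star = simulate_ar1(len(y_obs), phi_hat, sigma_hat, rng)
    y_next = phi_hat * y_star[-1] + rng.normal(0.0, sigma_hat)   # "future" value
    phi_b, sigma_b = fit_ar1(y_star)
    coverage += np.abs(y_next - phi_b * y_star[-1]) <= z * sigma_b
coverage /= B

# Calibrated interval: the nominal level whose bootstrap coverage hits the target.
lev_cal = levels[np.argmin(np.abs(coverage - target))]
z_cal = stats.norm.ppf(0.5 + lev_cal / 2)
centre = phi_hat * y_obs[-1]
print(f"calibrated nominal level: {lev_cal:.3f}")
print(f"calibrated interval: ({centre - z_cal * sigma_hat:.3f}, {centre + z_cal * sigma_hat:.3f})")
```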

    Confidence distributions for predictive tail probabilities

    In this short paper we propose the use of a calibration procedure to obtain predictive probabilities for a future random variable of interest. The new calibration method gives rise to a confidence distribution function whose probabilities are close to the nominal ones to a high order of approximation. Moreover, the proposed predictive distribution can be easily obtained by means of a bootstrap simulation procedure. A simulation study is presented to assess the good properties of our proposal. The calibrated procedure is also applied to a series of real data on sport records, with the aim of accurately estimating the probability of future records.
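    One possible bootstrap route to such calibrated predictive probabilities, sketched here for a Gaussian i.i.d. toy model (the model, sample size, threshold, and all names are assumptions, not the paper's setting): adjust the estimative plug-in probability by the bootstrap distribution of the plug-in PIT value of a future observation generated under the fitted model.

```python
# Hypothetical sketch: estimative vs bootstrap-calibrated predictive
# probability P(Y <= y0) in a Gaussian i.i.d. model with unknown mean/variance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
y = rng.normal(10.0, 2.0, size=15)            # small observed sample
mu_hat, s_hat = y.mean(), y.std(ddof=1)
y0 = 14.0                                     # threshold of interest

estimative_cdf = stats.norm.cdf((y0 - mu_hat) / s_hat)

# Parametric bootstrap distribution of the plug-in PIT value of a future
# observation generated under the fitted model.
B = 5000
pit = np.empty(B)
for b in range(B):
    y_star = rng.normal(mu_hat, s_hat, size=y.size)
    mu_b, s_b = y_star.mean(), y_star.std(ddof=1)
    z_future = rng.normal(mu_hat, s_hat)
    pit[b] = stats.norm.cdf((z_future - mu_b) / s_b)

calibrated_cdf = np.mean(pit <= estimative_cdf)
print("estimative  P(Y > y0):", 1 - estimative_cdf)
print("calibrated  P(Y > y0):", 1 - calibrated_cdf)
```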

    On the relationship between alpha connections and the asymptotic properties of predictive distributions

    In a recent paper, Komaki studied the second-order asymptotic properties of predictive distributions, using the Kullback-Leibler divergence as a loss function. He showed that estimative distributions based on asymptotically efficient estimators can be improved by predictive distributions that do not belong to the model, which is assumed to be a multidimensional curved exponential family. In this paper we generalize the result, taking as loss function any f-divergence. A relationship arises between alpha-connections and optimal predictive distributions. In particular, when an alpha-divergence is used to measure the goodness of a predictive distribution, the optimal shift of the estimative distribution is related to alpha-covariant derivatives. The expression we obtain for the asymptotic risk is also useful for studying the higher-order asymptotic properties of an estimator under this class of loss functions.
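    For reference, the alpha-divergence family referred to above is, in one standard (Amari) convention, the following; this is the textbook definition, quoted here for convenience rather than taken from the paper.

```latex
D_\alpha(p, q) \;=\; \frac{4}{1-\alpha^{2}}
  \left( 1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx \right),
  \qquad -1 < \alpha < 1,
```

    with the limits recovering the two Kullback-Leibler divergences: as alpha -> -1 one obtains \int p \log(p/q)\,dx, and as alpha -> 1 one obtains \int q \log(q/p)\,dx.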

    Multivariate prediction

    The problem of prediction is considered in a multidimensional setting. Extending an idea presented by Barndorff-Nielsen and Cox, a predictive density for a multivariate random variable of interest is proposed. This density has the form of an estimative density plus a correction term, and it gives simultaneous prediction regions with coverage error of smaller asymptotic order than the estimative density. A simulation study is also presented, showing the magnitude of the improvement with respect to the estimative method.
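    To make the coverage-error phenomenon concrete, the following Monte Carlo sketch (a toy bivariate normal setting with an assumed sample size and level, not the paper's examples) shows the undercoverage of the estimative simultaneous prediction region obtained by plugging the sample mean and covariance into the chi-square ellipsoid.

```python
# Hypothetical illustration: coverage of the estimative simultaneous prediction
# region {y : (y - m_hat)' S_hat^{-1} (y - m_hat) <= chi2 quantile} for a
# bivariate normal, estimated by Monte Carlo.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
d, n, level, n_rep = 2, 15, 0.90, 10000
chi2_q = stats.chi2.ppf(level, df=d)
mean_true, cov_true = np.zeros(d), np.eye(d)

hits = 0
for _ in range(n_rep):
    sample = rng.multivariate_normal(mean_true, cov_true, size=n)
    m_hat = sample.mean(axis=0)
    S_hat = np.cov(sample, rowvar=False)
    y_future = rng.multivariate_normal(mean_true, cov_true)  # independent future value
    delta = y_future - m_hat
    hits += (delta @ np.linalg.solve(S_hat, delta) <= chi2_q)

print(f"nominal {level:.2f}, empirical coverage {hits / n_rep:.3f}")
# For n = 15 the empirical coverage is well below 0.90; the corrected
# predictive density described above reduces this coverage error.
```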
